555 research outputs found

    The STRESS Method for Boundary-point Performance Analysis of End-to-end Multicast Timer-Suppression Mechanisms

    Evaluation of Internet protocols usually relies on random scenarios or scenarios based on designers' intuition. Such an approach may be useful for average-case analysis but does not cover boundary-point (worst- or best-case) scenarios. Synthesizing boundary-point scenarios requires a more systematic approach. In this paper, we present a method for the automatic synthesis of worst- and best-case scenarios for protocol boundary-point evaluation. Our method uses a fault-oriented test generation (FOTG) algorithm to search the protocol and system state space and synthesize these scenarios. The algorithm is based on a global finite state machine (FSM) model, which we extend with timing semantics to handle end-to-end delays and address performance criteria. We introduce the notion of a virtual LAN to represent delays of the underlying multicast distribution tree. The algorithms in our method use implicit backward search with branch-and-bound techniques, starting from given target events, which drastically reduces the search complexity. As a case study, we use our method to evaluate variants of the timer suppression mechanism, used in various multicast protocols, with respect to two performance criteria: overhead of response messages and response time. Simulation results for reliable multicast protocols show that our method provides a scalable way to synthesize worst-case scenarios automatically, and that results obtained using stress scenarios differ dramatically from those obtained through average-case analyses. We hope our method will serve as a model for applying systematic scenario generation to other multicast protocols. Comment: 24 pages, 10 figures; IEEE/ACM Transactions on Networking (ToN), to appear.
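
    The timer suppression mechanism evaluated in the case study works, in its generic form, as follows: each receiver that needs to respond (for example, to report a loss) schedules its response after a random delay, and suppresses that response if it first hears an equivalent response multicast by another receiver. The Python sketch below is a minimal illustration of that generic mechanism, assuming uniform random timers and a single shared one-way delay; it is not the paper's FOTG search algorithm, and the names (simulate_suppression, timer_max, one_way_delay) are illustrative.

        import random

        def simulate_suppression(num_receivers=50, timer_max=1.0, one_way_delay=0.05, seed=0):
            """Simulate one round of a generic multicast timer-suppression mechanism.

            Each receiver schedules its response after a random delay in [0, timer_max].
            When the earliest response is multicast, receivers whose timers have not yet
            fired by the time it arrives (one_way_delay later) suppress their responses.
            Returns (number of responses actually sent, time of the first response).
            """
            rng = random.Random(seed)
            timers = sorted(rng.uniform(0.0, timer_max) for _ in range(num_receivers))
            first_fire = timers[0]
            cutoff = first_fire + one_way_delay
            # Receivers whose timers fire before the first response arrives still send.
            responses_sent = sum(1 for t in timers if t <= cutoff)
            return responses_sent, first_fire

        if __name__ == "__main__":
            sent, latency = simulate_suppression()
            print(f"responses sent: {sent}, first response after {latency:.3f}s")

    The two returned quantities correspond to the abstract's two performance criteria: the number of response messages actually sent (overhead) and the delay until the first response (response time).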

    Beyond Classification: Latent User Interests Profiling from Visual Contents Analysis

    User preference profiling is an important task in modern online social networks (OSN). With the proliferation of image-centric social platforms, such as Pinterest, visual content has become one of the most informative data streams for understanding user preferences. Traditional approaches usually treat visual content analysis as a general classification problem in which one or more labels are assigned to each image. Although such an approach simplifies the process of image analysis, it misses the rich context and visual cues that play an important role in people's perception of images. In this paper, we explore the possibility of learning a user's latent visual preferences directly from image contents. We propose a distance metric learning method based on Deep Convolutional Neural Networks (CNN) to extract similarity information directly from visual contents and use the derived distance metric to mine individual users' fine-grained visual preferences. Through preliminary experiments using data from 5,790 Pinterest users, we show that even for images within the same category, each user possesses distinct and individually identifiable visual preferences that are consistent over their lifetime. Our results underscore the untapped potential of finer-grained visual preference profiling in understanding users' preferences. Comment: 2015 IEEE 15th International Conference on Data Mining Workshop.
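
    As a rough illustration of the kind of distance metric learning described above, the following Python (PyTorch) sketch trains a small convolutional embedding network with a triplet margin loss, so that images a user treats as similar map close together and dissimilar images map far apart. The architecture and names (EmbeddingNet, embed_dim, the 64x64 dummy inputs) are assumptions for illustration only, not the authors' network, which would be trained on real Pinterest images.

        import torch
        import torch.nn as nn

        class EmbeddingNet(nn.Module):
            """Toy CNN that maps an RGB image to a unit-norm embedding vector."""
            def __init__(self, embed_dim=128):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.fc = nn.Linear(32, embed_dim)

            def forward(self, x):
                h = self.features(x).flatten(1)
                return nn.functional.normalize(self.fc(h), dim=1)

        net = EmbeddingNet()
        loss_fn = nn.TripletMarginLoss(margin=0.2)

        # Dummy anchor / positive / negative batches standing in for real image triplets.
        anchor, positive, negative = (torch.randn(8, 3, 64, 64) for _ in range(3))
        loss = loss_fn(net(anchor), net(positive), net(negative))
        loss.backward()
        print(float(loss))

    Once trained, distances between embeddings serve as the learned similarity metric from which per-user fine-grained visual preferences can be mined.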

    Data communications via cable television networks : technical and policy considerations

    Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1983. Microfiche copy available in Archives and Engineering. Bibliography: leaves 144-151. By Deborah Lynn Estrin. M.S.

    Recovering Temporal Integrity with Data Driven Time Synchronization

    Data Driven Time Synchronization (DDTS) provides synchronization across sensors by exploiting underlying characteristics of the data collected by an embedded sensing system. We apply DDTS to a seismic deployment of 100 seismic sensors to repair data that was not correctly time synchronized. The deployment used GPS for time synchronization, but due to system faults common to environmental sensing systems, data was collected with large time offsets. In seismic deployments such offset data is often discarded; we show that DDTS can recover the synchronization and make the data usable. To repair the time offsets we use microseisms as the underlying data characteristic. Microseisms are waves that travel through the earth's crust and are independent of the seismic events studied to infer the earth's structure. We have developed a model of microseism propagation through a linear seismic array and use it to obtain time-correction shifts. By simulating time offsets in real data that has no offsets, we determined that this method can repair offsets to less than 0.2 seconds. Our ongoing work will refine the model to correct offsets to 0.05 seconds and evaluate how errors in the correction affect seismic results such as event location. DDTS may also be applicable to other high-data-rate embedded sensing applications such as acoustic source localization.
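
    The core idea, recovering relative clock offsets from the recorded signals themselves rather than from GPS, can be illustrated with a simple cross-correlation between two records of the same ambient signal. The Python sketch below is only an illustration under assumed inputs (a synthetic narrow-band stand-in for a microseism, a simulated 0.2-second clock offset, and the hypothetical helper estimate_offset); the work described above instead fits a propagation model across the 100-sensor linear array.

        import numpy as np

        def estimate_offset(ref, other, fs):
            """Estimate the clock offset (seconds) of `other` relative to `ref`
            as the lag of peak cross-correlation between the two records."""
            corr = np.correlate(other - other.mean(), ref - ref.mean(), mode="full")
            lag = np.argmax(corr) - (len(ref) - 1)
            return lag / fs

        if __name__ == "__main__":
            fs = 100.0                                   # samples per second
            t = np.arange(0, 60, 1 / fs)
            ambient = np.sin(2 * np.pi * 0.2 * t) + 0.1 * np.random.randn(t.size)
            shifted = np.roll(ambient, int(0.2 * fs))    # simulate a 0.2 s clock offset
            print(f"estimated offset: {estimate_offset(ambient, shifted, fs):.2f} s")

    The lag of peak correlation, divided by the sampling rate, gives the relative time shift that would be applied as a correction.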